Measuring questions: relevance and its relation to entropy
The Boolean lattice of logical statements induces the free distributive
lattice of questions. Inclusion on this lattice is based on whether one
question answers another. Generalizing the zeta function of the question
lattice leads to a valuation called relevance or bearing, which is a measure of
the degree to which one question answers another. Richard Cox conjectured that
this degree can be expressed as a generalized entropy. With the assistance of
yet another important result from János Aczél, I show that this is indeed the
case, and that the resulting inquiry calculus is a natural generalization of
information theory. This approach provides a new perspective on the Principle
of Maximum Entropy.
Comment: 8 pages, 1 figure. Presented at the MaxEnt 2004 meeting in Garching, Germany. To be published in: R. Fischer, V. Dose (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Garching, Germany, 2004, AIP Conference Proceedings, American Institute of Physics, Melville N
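As a rough illustration of the question–entropy link described above, one can treat a question as a partition of its possible answers and score it by Shannon entropy. This is a minimal sketch of the simplest case, not the full inquiry calculus; the function name and probabilities are hypothetical:

```python
import math

def entropy(probs):
    """Shannon entropy of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A question partitions the space of possible answers. In the simplest
# case of Knuth's inquiry calculus, the relevance of a question to the
# central issue reduces to the entropy of that partition.
# Hypothetical example: a fair binary question vs. a heavily biased one.
fair = entropy([0.5, 0.5])      # 1 bit: maximally informative answer
biased = entropy([0.9, 0.1])    # less than 1 bit: answer mostly predictable
```

The biased question carries less entropy, matching the intuition that a question whose answer is nearly certain in advance bears little on the central issue.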
The Problem of Motion: The Statistical Mechanics of Zitterbewegung
Around 1930, both Gregory Breit and Erwin Schroedinger showed that the
eigenvalues of the velocity of a particle described by wavepacket solutions to
the Dirac equation are simply c, the speed of light. This led Schroedinger
to coin the term Zitterbewegung, German for "trembling motion", in which all
particles of matter (fermions) zig-zag back and forth, always at the speed of
light. The result is that any finite speed less than c, including the state of
rest, only makes sense as a long-term average that can be thought of as a
drift velocity. In this paper, we take seriously the idea that the
observed velocities of particles are time-averages of motion at the speed of
light and demonstrate how the relativistic velocity addition rule in one
spatial dimension is readily derived by considering the probabilities that a
particle is observed to move either to the left or to the right at the speed of
light.
Comment: Knuth K.H. 2014. The problem of motion: the statistical mechanics of Zitterbewegung. Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Amboise, France, Sept 2014, AIP Conference Proceedings, American Institute of Physics, Melville N
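The derivation sketched in this abstract can be checked numerically. Assuming, as one plausible reading, that the drift velocity is v = c(2p − 1) for rightward-step probability p, and that independent motions compose by multiplying the odds of a rightward step, the relativistic velocity-addition rule follows. All names and numbers here are illustrative:

```python
def drift_velocity(p_right, c=1.0):
    """Long-term average velocity of a particle moving at speed c,
    stepping right with probability p_right, left with 1 - p_right."""
    return c * (2.0 * p_right - 1.0)

def odds(p):
    """Odds in favor of a rightward step."""
    return p / (1.0 - p)

def prob_from_odds(r):
    """Invert the odds back to a probability."""
    return r / (1.0 + r)

def compose(p1, p2):
    """Combine two independent tendencies by multiplying odds
    (the assumed composition rule in this sketch)."""
    return prob_from_odds(odds(p1) * odds(p2))

# Composing drifts this way reproduces (u + v) / (1 + u*v/c^2):
u = drift_velocity(0.8)                # 0.6 c
v = drift_velocity(0.6)                # 0.2 c
w = drift_velocity(compose(0.8, 0.6))  # composed drift
assert abs(w - (u + v) / (1 + u * v)) < 1e-12
```

With c = 1, the composed drift is 5/7 ≈ 0.714, exactly the relativistic sum of 0.6 and 0.2, and the result can never exceed c because a probability never exceeds 1.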
Information Physics: The New Frontier
At this point in time, two major areas of physics, statistical mechanics and
quantum mechanics, rest on the foundations of probability and entropy. The last
century saw several significant fundamental advances in our understanding of
the process of inference, which make it clear that these are inferential
theories. That is, rather than being a description of the behavior of the
universe, these theories describe how observers can make optimal predictions
about the universe. In such a picture, information plays a critical role. What
is more, clues such as the fact that black holes have entropy continue to
suggest that information is fundamental to physics in general.
In the last decade, our fundamental understanding of probability theory has
led to a Bayesian revolution. In addition, we have come to recognize that the
foundations go far deeper and that Cox's approach of generalizing a Boolean
algebra to a probability calculus is the first specific example of the more
fundamental idea of assigning valuations to partially-ordered sets. By
considering this as a natural way to introduce quantification to the more
fundamental notion of ordering, one obtains an entirely new way of deriving
physical laws. I will introduce this new way of thinking by demonstrating how
one can quantify partially-ordered sets and, in the process, derive physical
laws. The implication is that physical law does not reflect the order in the
universe, instead it is derived from the order imposed by our description of
the universe. Information physics, which is based on understanding the ways in
which we both quantify and process information about the world around us, is a
fundamentally new approach to science.
Comment: 17 pages, 6 figures. Knuth K.H. 2010. Information physics: The new frontier. J.-F. Bercher, P. Bessière, and A. Mohammad-Djafari (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2010), Chamonix, France, July 201
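Assigning valuations to partially ordered sets can be sketched with the simplest example: the lattice of subsets ordered by inclusion, with an additive valuation. The check below verifies the sum rule v(x∨y) + v(x∧y) = v(x) + v(y), the lattice-theoretic constraint from which, in this program, probability's sum rule is derived. The code is illustrative, not Knuth's construction:

```python
from itertools import combinations

def valuation(s, weights):
    """An additive valuation on the lattice of subsets:
    the sum of the weights of the subset's elements."""
    return sum(weights[e] for e in s)

# Hypothetical atoms and weights.
weights = {'a': 1.0, 'b': 2.0, 'c': 4.0}
universe = set(weights)

# Exhaustively check the sum rule over every pair of lattice elements:
# v(x | y) + v(x & y) == v(x) + v(y), with join = union, meet = intersection.
for i in range(len(universe) + 1):
    for x in map(set, combinations(sorted(universe), i)):
        for j in range(len(universe) + 1):
            for y in map(set, combinations(sorted(universe), j)):
                lhs = valuation(x | y, weights) + valuation(x & y, weights)
                rhs = valuation(x, weights) + valuation(y, weights)
                assert abs(lhs - rhs) < 1e-12
```

Any valuation obeying this rule behaves like an (unnormalized) measure on the lattice; normalizing by the valuation of the top element yields probability-like numbers.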
Inferences about Interactions: Fermions and the Dirac Equation
At a fundamental level every measurement process relies on an interaction
where one entity influences another. The boundary of an interaction is given by
a pair of events, which can be ordered by virtue of the interaction. This
results in a partially ordered set (poset) of events often referred to as a
causal set. In this framework, an observer can be represented by a chain of
events. Quantification of events and pairs of events, referred to as intervals,
can be performed by projecting them onto an observer chain, or even a pair of
observer chains, which in specific situations leads to a Minkowski metric
replete with Lorentz transformations. We illustrate how this framework of
interaction events gives rise to some of the well-known properties of
fermions, such as Zitterbewegung. We then take this further by making
inferences about events, employing the process calculus, which coincides
with the Feynman path integral formulation of quantum mechanics. We show
that in the 1+1 dimensional case this results in the Feynman
checkerboard model of the Dirac equation describing a Fermion at rest.
Comment: 11 pages, 3 figures. To be published in the MaxEnt 2012 proceedings
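The projection scheme described above can be sketched numerically, under the assumed convention (from the Knuth–Bahreyni line of work, with c = 1) that an interval's projections (Δp, Δq) onto a pair of observer chains give t = (Δp + Δq)/2, x = (Δp − Δq)/2, and squared interval Δp·Δq:

```python
def interval_from_projections(dp, dq):
    """Quantify an interval by its projections (dp, dq) onto a pair of
    observer chains; returns (t, x, squared interval), with c = 1.
    The conventions here are an assumed sketch, not the paper's full scheme."""
    t = 0.5 * (dp + dq)   # symmetric combination: coordinate time
    x = 0.5 * (dp - dq)   # antisymmetric combination: position
    return t, x, dp * dq  # the product is the squared interval

# The product of projections reproduces the Minkowski form t^2 - x^2:
t, x, s2 = interval_from_projections(3.0, 1.0)
assert abs(s2 - (t**2 - x**2)) < 1e-12
```

The Minkowski metric thus appears as an algebraic identity of the pair-of-chains quantification: (Δp)(Δq) = t² − x² holds for every choice of projections.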
A Bayesian Analysis of HAT-P-7b Using the EXONEST Algorithm
The study of exoplanets (planets orbiting other stars) is revolutionizing the
way we view our universe. High-precision photometric data provided by the
Kepler Space Telescope (Kepler) enables not only the detection of such planets,
but also their characterization. This presents a unique opportunity to apply
Bayesian methods to better characterize the multitude of previously confirmed
exoplanets. This paper focuses on applying the EXONEST algorithm to
characterize the transiting short-period hot Jupiter HAT-P-7b. EXONEST
evaluates a suite of exoplanet photometric models by applying Bayesian Model
Selection, which is implemented with the MultiNest algorithm. These models take
into account planetary effects, such as reflected light and thermal emissions,
as well as effects of the planetary motion on the host star, such as Doppler
beaming (boosting) of starlight caused by the star's reflex motion, and
photometric variations due to the planet-induced ellipsoidal shape of the host
star. By calculating model evidences, one can determine which model best
describes the observed data, thus identifying which effects dominate the
planetary system. Presented are parameter estimates and model evidences for
HAT-P-7b.
Comment: Submitted to the conference proceedings for MaxEnt 2014, to be published by AI
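Bayesian model selection by comparing evidences, as EXONEST does, can be sketched with a toy example. This is not EXONEST or MultiNest; it is an illustrative marginal likelihood under a discrete uniform prior, with hypothetical data and models:

```python
import math

def log_evidence(data, model_means, sigma=1.0):
    """Toy marginal likelihood: average the Gaussian likelihood of the
    data over a discrete uniform prior on the model's candidate means."""
    def log_like(mu):
        return sum(-0.5 * ((d - mu) / sigma) ** 2
                   - 0.5 * math.log(2 * math.pi * sigma ** 2)
                   for d in data)
    logs = [log_like(mu) for mu in model_means]
    m = max(logs)  # log-sum-exp for numerical stability
    return m + math.log(sum(math.exp(l - m) for l in logs) / len(logs))

# Hypothetical data clustered near 1, and two competing models.
data = [0.9, 1.1, 1.0, 0.95]
zA = log_evidence(data, [0.8, 1.0, 1.2])    # Model A: mean near 1
zB = log_evidence(data, [-0.2, 0.0, 0.2])   # Model B: mean near 0
log_bayes_factor = zA - zB                  # positive: data favor Model A
```

Comparing evidences in this way automatically penalizes models whose prior mass sits far from where the data lie, which is the mechanism by which EXONEST identifies which photometric effects the data actually support.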